Perceptron capacity revisited: classification ability for correlated patterns
Abstract
In this paper, we address the problem of how many randomly labeled patterns can be correctly classified by a single-layer perceptron when the patterns are correlated with each other. To solve this problem, two analytical schemes are developed, based on the replica method and on the Thouless-Anderson-Palmer (TAP) approach, utilizing an integral formula for random rectangular matrices. The validity and relevance of the developed methodologies are demonstrated on one known result and two example problems. A message-passing algorithm that implements the TAP scheme is also presented.
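As a rough numerical counterpart to the question posed above (and not the replica/TAP analysis developed in the paper), one can check directly how often randomly labelled, mutually correlated patterns are linearly separable. The correlation model below (a single shared Gaussian component of strength c) and all sizes are illustrative assumptions.

```python
# Minimal numerical sketch (not the paper's replica/TAP analysis): estimate how
# often P randomly labelled, correlated Gaussian patterns in N dimensions are
# linearly separable, by solving a feasibility linear program for the weights.
# The correlation structure (a single shared component of strength c) is an
# illustrative assumption, not the ensemble analysed in the paper.
import numpy as np
from scipy.optimize import linprog

def separable(X, y):
    """True if some weight vector w satisfies y_i * (w . x_i) >= 1 for all i."""
    P, N = X.shape
    A_ub = -(y[:, None] * X)              # -y_i x_i . w <= -1  <=>  y_i x_i . w >= 1
    b_ub = -np.ones(P)
    res = linprog(c=np.zeros(N), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * N, method="highs")
    return res.success                    # feasible <=> separable

def trial(N, alpha, c, rng):
    P = int(alpha * N)
    common = rng.standard_normal(N)       # shared component inducing correlations
    X = np.sqrt(1 - c) * rng.standard_normal((P, N)) + np.sqrt(c) * common
    y = rng.choice([-1.0, 1.0], size=P)   # random labels
    return separable(X, y)

rng = np.random.default_rng(0)
N = 50
for alpha in (1.5, 2.0, 2.5):
    rate = np.mean([trial(N, alpha, c=0.3, rng=rng) for _ in range(20)])
    print(f"alpha = {alpha}: separable in {rate:.0%} of trials")
```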
Similar resources
Neural Networks Revisited: a Statistical View on Optimisation and Generalisation
Statistical methods can be applied to the analysis of neural networks to obtain on-average results for robustness, capacity, and generalisation in the presence of certain network architectures and data distributions. In particular, the dynamics of generalisation and learning of the Adaline training algorithm are calculated for correlated patterns. Modified algorithms are derived which restore...
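For readers unfamiliar with Adaline, the following is a minimal sketch of its delta-rule (LMS) update on correlated, teacher-labelled patterns; the pattern ensemble, learning rate and teacher setup are assumptions chosen only for illustration, not the setting analysed in the paper.

```python
# Minimal sketch of the Adaline (LMS) rule on correlated patterns.
# Pattern ensemble, learning rate and teacher are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N, P, epochs = 100, 80, 50
eta = 0.5 / N                                      # small step keeps LMS stable

common = rng.standard_normal(N)                    # shared component -> correlated patterns
X = 0.8 * rng.standard_normal((P, N)) + 0.6 * common
teacher = rng.standard_normal(N)
y = np.sign(X @ teacher)                           # labels from a teacher perceptron

w = np.zeros(N)
for _ in range(epochs):
    for i in rng.permutation(P):
        err = y[i] - X[i] @ w                      # Adaline: error on the linear output
        w += eta * err * X[i]                      # delta-rule (LMS) update

print("training accuracy:", np.mean(np.sign(X @ w) == y))
```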
Storing Block-wise Semantically Correlated Patterns in a Perceptron: Results from a Cavity Method
In this paper we calculate the storage capacity of the perceptron for block-wise semantically correlated patterns. The set of patterns is divided into blocks, each block consisting of n binary patterns. The patterns within a block have overlap R with each other, and patterns in different blocks have overlap 0. Using a cavity method, which was recently developed by F. Gerl [1], we derive a general ...
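One standard way to generate such block-wise correlated ±1 patterns (an assumption about the construction, not necessarily the ensemble used in the paper) is to draw a random ancestor per block and copy each ancestor bit with probability (1 + √R)/2, which gives expected overlap R within a block and 0 across blocks.

```python
# One common construction (an assumption, not necessarily the paper's) of
# block-wise correlated +/-1 patterns: patterns in a block share a random
# ancestor and copy each ancestor bit with probability (1 + sqrt(R)) / 2,
# yielding expected overlap R within a block and 0 across blocks.
import numpy as np

def blockwise_patterns(N, n_blocks, n_per_block, R, rng):
    p_copy = 0.5 * (1.0 + np.sqrt(R))
    patterns = []
    for _ in range(n_blocks):
        ancestor = rng.choice([-1, 1], size=N)
        for _ in range(n_per_block):
            copy = rng.random(N) < p_copy
            patterns.append(np.where(copy, ancestor, -ancestor))
    return np.array(patterns)

rng = np.random.default_rng(2)
X = blockwise_patterns(N=2000, n_blocks=3, n_per_block=4, R=0.25, rng=rng)
# empirical overlap between two patterns of the same block (should be near R = 0.25)
print(np.mean(X[0] * X[1]))
```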
High capacity recurrent associative memories
Various algorithms for constructing weight matrices for Hopfield-type associative memories are reviewed, including ones with much higher capacity than the basic model. These alternative algorithms either iteratively approximate the projection weight matrix or use simple perceptron learning. An experimental investigation of the performance of networks trained by these algorithms is presented, in...
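One of the higher-capacity constructions alluded to here is the projection (pseudo-inverse) rule. A minimal sketch follows, with pattern statistics, loading and the single-step recall test chosen purely for illustration.

```python
# Minimal sketch of the projection (pseudo-inverse) rule for a Hopfield-type
# associative memory: W = Xi (Xi^T Xi)^{-1} Xi^T with self-couplings removed.
# Pattern statistics and the one-step recall test are illustrative choices.
import numpy as np

rng = np.random.default_rng(3)
N, P = 200, 60                                     # alpha = P/N = 0.3, above the ~0.14 Hebb limit
Xi = rng.choice([-1, 1], size=(N, P))              # columns are the stored patterns

W = Xi @ np.linalg.pinv(Xi)                        # projection onto the pattern subspace
np.fill_diagonal(W, 0.0)                           # conventionally remove self-couplings

# recall test: flip 10% of the bits of the first pattern, one synchronous update
probe = Xi[:, 0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
recalled = np.sign(W @ probe)
print("fraction of correctly recalled bits:", np.mean(recalled == Xi[:, 0]))
```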
Storage of correlated patterns in a perceptron
We calculate the storage capacity of a perceptron for correlated Gaussian patterns. We find that the storage capacity αc can be less than 2 if similar patterns are mapped onto different outputs and vice versa. As long as the patterns are in general position we obtain, in contrast to previous works, that αc ≥ 1, in agreement with Cover's theorem. Numerical simulations confirm the results. The c...
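The bound αc ≥ 1 connects to Cover's counting function C(P, N) = 2 Σ_{k=0}^{N-1} C(P−1, k): for P ≤ N points in general position every labelling is realisable, and the probability that a random labelling is separable drops sharply near α = P/N = 2. A short sketch evaluating this probability (sizes are illustrative):

```python
# Cover's counting function: C(P, N) = 2 * sum_{k=0}^{N-1} binom(P-1, k) is the
# number of dichotomies of P points in general position in R^N realisable by a
# homogeneous perceptron; C(P, N) / 2^P is the probability that a random
# labelling is separable, which drops sharply near alpha = P / N = 2.
from math import comb

def prob_separable(P, N):
    c = 2 * sum(comb(P - 1, k) for k in range(N))
    return c / 2 ** P

N = 100
for alpha in (1.0, 1.5, 2.0, 2.5, 3.0):
    P = int(alpha * N)
    print(f"alpha = {alpha}: P(separable) = {prob_separable(P, N):.3f}")
```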
Learning by dilution in a Neural Network
A perceptron with N random weights can store of the order of N patterns by removing a fraction of the weights without changing their strengths. The critical storage capacity as a function of the concentration of the remaining bonds is calculated for random outputs and for outputs given by a teacher perceptron. A simple Hebb-like dilution algorithm is presented which in the teacher case reaches...
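As one concrete reading of learning by removing weights (a heuristic sketch assumed here, not necessarily the Hebb-like algorithm of the paper), one can keep exactly those fixed random weights whose sign agrees with the Hebbian field of the teacher-labelled patterns.

```python
# Heuristic sketch of Hebb-like dilution (an assumption, not necessarily the
# paper's algorithm): the weights are fixed random numbers and learning only
# chooses which of them to keep, namely those whose sign agrees with the
# Hebbian field sum_mu sigma^mu xi_i^mu of the teacher-labelled patterns.
import numpy as np

rng = np.random.default_rng(4)
N, P = 500, 200

teacher = rng.standard_normal(N)
Xi = rng.choice([-1, 1], size=(P, N))
sigma = np.sign(Xi @ teacher)                 # teacher outputs

w_random = rng.standard_normal(N)             # fixed strengths, never changed
hebb = sigma @ Xi                             # Hebbian field per weight
keep = np.sign(w_random) == np.sign(hebb)     # dilution: keep sign-consistent bonds
w_diluted = np.where(keep, w_random, 0.0)

train_acc = np.mean(np.sign(Xi @ w_diluted) == sigma)
print(f"kept fraction: {keep.mean():.2f}, training accuracy: {train_acc:.2f}")
```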
Journal title:
Volume, issue:
Pages: -
Year of publication: 2008